Web Survey Bibliography
Relevance & Research Question: One of the most significant decisions when designing survey questions is whether to pose them as closed-ended or open-ended. Closed-ended questions require respondents to choose from a set of provided response options, while open-ended questions are answered by respondents in their own words. Open-ended questions offer the benefit of not constraining responses, allowing respondents to answer freely and elaborate on their answers. Narrative open-ended questions are especially useful when no suitable answer categories are available for a closed-ended format, or when providing response options might bias respondents. Open-ended questions are also powerful tools for collecting more detailed and specific responses from large samples of respondents. However, they are burdensome to answer and suffer from higher rates of item nonresponse. This thesis aims to improve narrative open-ended questions in Web surveys through visual and adaptive questionnaire design. Previous research on open-ended questions demonstrated that respondents react to the size and design of the answer box offered with an open-ended question in Web surveys. Larger answer boxes seem to pose an additional burden compared to smaller ones. At the same time, larger answer boxes act as a stimulus that increases the length of the responses provided by those respondents who do answer the question. By varying the visual design of answer boxes, this thesis seeks ways to improve narrative open-ended questions. In addition to the influence of different answer-box sizes, the effectiveness of a counter associated with the answer box is tested. Furthermore, dynamically growing answer boxes were compared to answer boxes whose size respondents adjusted themselves.
Beyond varying the visual appearance of narrative open-ended questions and the answer boxes used, the interactive nature of the internet allows a multiplicity of ways to integrate interactive features into a survey. Web surveys can be adapted to individual respondents or groups of respondents. Based on previous answers, it is feasible to provide specifically designed questions to engage respondents. This thesis puts two adaptive design approaches to improving narrative open-ended questions to the test.
Methods & Data: In addition to the influence of three different answer-box sizes, the effectiveness of a counter associated with the answer box, continuously indicating the number of characters left to type, is tested. Furthermore, dynamically growing answer boxes were compared to answer boxes whose size respondents adjusted themselves via a plus or minus button. This thesis also puts two adaptive design approaches to improving narrative open-ended questions to the test. The amount of information respondents typed into the response box of an initial open-ended question was used to assign them to a custom-size answer box later in the survey. In addition, a follow-up probe was tested in which respondents who did not respond to a narrative open-ended question were assigned the same question in a closed format, in order to obtain at least some information from them. All experiments were embedded in large-scale surveys among university applicants or students.
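The two visual-design features described above can be sketched as small pieces of client-side logic. This is a minimal illustration, not the thesis's actual implementation; the function names, row limits, and counter defaults are assumptions introduced here.

```typescript
// Hypothetical sketch of two features from the visual-design experiments:
// a character counter with a preset default value, and an answer box whose
// height the respondent adjusts via plus/minus buttons.
// All names and limits are illustrative assumptions.

/** Characters remaining, given the counter's preset default value. */
function charactersLeft(answer: string, counterDefault: number): number {
  return counterDefault - answer.length;
}

/** Respondent-adjusted box height in rows, clamped to a sensible range. */
function adjustBoxRows(
  currentRows: number,
  delta: 1 | -1,
  minRows = 2,
  maxRows = 20
): number {
  return Math.min(maxRows, Math.max(minRows, currentRows + delta));
}
```

In a live questionnaire, `charactersLeft` would be re-evaluated on each keystroke and displayed next to the answer box, while `adjustBoxRows` would be wired to the plus and minus buttons and fed into the textarea's `rows` attribute.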
Results: While larger answer boxes were expected to pose an additional burden, we found no influence of answer-box size on item nonresponse. A counter indicating the number of characters left curtailed response length when its default value was set low, and increased response length when the default value was set high. However, a low-value counter limited the number of words respondents used, but not the number of topics they reported. Respondents always seemed to report what they intended to report. Automatically growing answer-box designs did not improve response length or the number of topics reported in response to narrative open-ended questions. In the respondent-adjusted design, respondents were able to set the answer-box size themselves. Because they were aware of the box-size adjustment, this design corresponds better with the question–answer process than the dynamically growing answer spaces. As a result, respondents reported more topics and produced longer responses with the self-adjusted answer-box design. Adaptively assigning individually sized answer boxes increased only the length of responses to narrative open-ended questions; the answer-box size does not seem to pose a higher burden to respond. As in the visual design experiments, responses can be improved by the adaptive answer-box size assignment, but the willingness to respond was not affected by any of the designs tested. To improve response rates, the final experiment in this thesis used a closed-ended follow-up probe to combine the strengths of closed- and open-ended questions. Switching to a closed-ended question is not ideal, but the design accomplished the aim of obtaining at least some information from former nonrespondents. In the initial open-ended question, respondents provided fewer topics but elaborated on them.
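The adaptive size assignment tested above can be sketched as a simple mapping from the length of a respondent's initial open-ended answer to a later answer-box size. The word-count thresholds and size categories below are illustrative assumptions, not values taken from the thesis.

```typescript
// Hypothetical sketch of the adaptive design: the amount of text a respondent
// typed into an initial open-ended question determines the size of an answer
// box shown later in the survey. Thresholds are illustrative assumptions.

type BoxSize = "small" | "medium" | "large";

function assignBoxSize(initialAnswer: string): BoxSize {
  const words = initialAnswer
    .trim()
    .split(/\s+/)
    .filter((w) => w.length > 0);
  if (words.length < 15) return "small"; // includes item nonresponse
  if (words.length < 40) return "medium";
  return "large";
}
```

The design intuition is that a box matched to the respondent's demonstrated verbosity avoids burdening terse respondents with a large empty space while still stimulating longer answers from verbose ones.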
In the closed-ended follow-up probe, respondents checked more answer categories than the number of topics reported in the open-ended question, most likely because the categories were readily at hand. Overall, the probe succeeded in obtaining information from those respondents who had neglected to answer the same question in an open format.
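The routing rule behind the follow-up probe amounts to a single branch: only former nonrespondents see the closed-ended version of the question. The function below is a hypothetical sketch of that rule; the names are assumptions for illustration.

```typescript
// Hypothetical sketch of the follow-up probe routing: respondents who left
// the narrative open-ended question empty are shown the same question again
// in a closed-ended format. Names are illustrative assumptions.

function nextQuestionFormat(openAnswer: string): "none" | "closed-probe" {
  // Only former nonrespondents receive the closed-ended follow-up probe.
  return openAnswer.trim().length === 0 ? "closed-probe" : "none";
}
```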
Added Value: Overall, the visual design experiments demonstrate that it is well worth paying attention to the visual and adaptive design of open-ended questions in Web surveys, and that well-designed open-ended questions are a powerful tool for collecting specific data from large samples of respondents. Further results provide preliminary support for the effectiveness of a Web survey design that adapts the type and visual design of survey questions to the motivation and capabilities of the respondent. While previous studies on the design of open-ended narrative questions aimed to enhance the effectiveness of design features meant to influence response behavior (in particular of less-motivated respondents), the adaptive design changes the questionnaire in order to get the most out of each respondent, consistent with their motivation and capabilities.